
    Capturing the zero: a new class of zero-augmented distributions and multiplicative error processes

    We propose a novel approach to model serially dependent positive-valued variables which realize a non-trivial proportion of zero outcomes. This is a typical phenomenon in financial time series observed at high frequencies, such as cumulated trading volumes or the time between potentially simultaneously occurring market events. We introduce a flexible point-mass mixture distribution and develop a semiparametric specification test explicitly tailored for such distributions. Moreover, we propose a new type of multiplicative error model (MEM) based on a zero-augmented distribution, which incorporates an autoregressive binary choice component and thus captures the (potentially different) dynamics of both zero occurrences and strictly positive realizations. Applying the proposed model to high-frequency cumulated trading volumes of liquid NYSE stocks, we show that the model captures both the dynamic and distributional properties of the data very well and is able to correctly predict future distributions. Keywords: high-frequency data, point-mass mixture, multiplicative error model, excess zeros, semiparametric specification test, market microstructure. JEL Classification: C22, C25, C14, C16, C5.
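
    A minimal simulation sketch of such a zero-augmented MEM is given below: a multiplicative error recursion governs the conditional mean of the strictly positive part, while an autoregressive logit governs the zero occurrences. The logistic link, the unit-mean gamma errors, and all parameter values are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_zero_augmented_mem(n=1000, omega=0.05, alpha=0.10, beta=0.85,
                                gamma0=-1.0, gamma1=2.5, shape=2.0):
    """Simulate a zero-augmented MEM: an autoregressive logit decides whether
    x_t is zero; otherwise x_t = mu_t * eps_t with unit-mean gamma errors.
    All parameter values are illustrative."""
    x = np.zeros(n)
    z = np.zeros(n, dtype=int)                      # 1 = strictly positive outcome
    mu = np.full(n, omega / (1.0 - alpha - beta))   # start at the unconditional mean
    for t in range(1, n):
        # MEM recursion for the conditional mean of the positive part
        mu[t] = omega + alpha * x[t - 1] + beta * mu[t - 1]
        # autoregressive binary choice: P(positive) depends on the lagged indicator
        p_pos = 1.0 / (1.0 + np.exp(-(gamma0 + gamma1 * z[t - 1])))
        z[t] = rng.binomial(1, p_pos)
        if z[t] == 1:
            x[t] = mu[t] * rng.gamma(shape, 1.0 / shape)   # E[eps_t] = 1
    return x, z, mu

x, z, mu = simulate_zero_augmented_mem()
print(f"share of zero outcomes: {1.0 - z.mean():.2f}")
```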

    Semiparametric Estimation with Generated Covariates

    In this paper, we study a general class of semiparametric optimization estimators of a vector-valued parameter. The criterion function depends on two types of infinite-dimensional nuisance parameters: a conditional expectation function that has been estimated nonparametrically using generated covariates, and another estimated function that is used to compute the generated covariates in the first place. We study the asymptotic properties of estimators in this class, which is a nonstandard problem due to the presence of generated covariates. We give conditions under which estimators are root-n consistent and asymptotically normal, and derive a general formula for the asymptotic variance. Keywords: semiparametric estimation, generated covariates, profiling, propensity score.
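
    As a rough illustration of the generated-covariates setting, the sketch below first estimates a propensity score (the generated covariate), then runs a kernel regression of the outcome on that estimated score, and finally computes a simple finite-dimensional quantity from the fitted functions. The simulated data, the parametric logit first step, and the bandwidth are assumptions made purely for illustration; the estimator class studied in the paper is considerably more general.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# simulated data: treatment D depends on covariates W, outcome Y depends on D
# and on the (unknown) propensity score; functional forms are illustrative
n = 2000
W = rng.normal(size=(n, 2))
p_true = 1 / (1 + np.exp(-(0.5 * W[:, 0] - 0.8 * W[:, 1])))
D = rng.binomial(1, p_true)
Y = 1.5 * D + np.sin(2 * p_true) + rng.normal(scale=0.5, size=n)

# Step 1: estimate the generated covariate (here a parametric propensity score)
logit = sm.Logit(D, sm.add_constant(W)).fit(disp=0)
p_hat = logit.predict(sm.add_constant(W))

def nw_regression(x_eval, x_obs, y_obs, h=0.05):
    """Gaussian-kernel Nadaraya-Watson estimate of E[Y | p_hat = x_eval]."""
    k = np.exp(-0.5 * ((x_eval[:, None] - x_obs[None, :]) / h) ** 2)
    return (k * y_obs).sum(axis=1) / k.sum(axis=1)

# Step 2: nonparametric regressions of Y on the generated covariate, by arm
m0 = nw_regression(p_hat, p_hat[D == 0], Y[D == 0])
m1 = nw_regression(p_hat, p_hat[D == 1], Y[D == 1])

# Step 3: a finite-dimensional quantity computed from the estimated functions
ate_hat = np.mean(m1 - m0)
print(f"estimated treatment effect: {ate_hat:.3f}")
```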

    Financial Network Systemic Risk Contributions

    We propose the systemic risk beta as a measure of financial companies' contribution to systemic risk, given network interdependence between firms' tail risk exposures. Conditional on statistically pre-identified network spillover effects as well as market and balance sheet information, we define the systemic risk beta as the time-varying marginal effect of a firm's Value-at-Risk (VaR) on the system's VaR. Suitable statistical inference reveals a multitude of relevant risk spillover channels and determines companies' systemic importance in the U.S. financial system. Our approach can be used to monitor companies' systemic importance, allowing for transparent macroprudential regulation. Keywords: systemic risk contribution, systemic risk network, Value at Risk, network topology, two-step quantile regression, time-varying parameters.
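
    A stripped-down sketch of the two-step quantile regression idea follows: the firm's conditional VaR is first obtained from a quantile regression on a state variable, and the system's return quantile is then regressed on that estimated VaR, whose coefficient plays the role of a systemic risk beta. The synthetic data, the single firm, and the time-invariant coefficient are simplifying assumptions; the approach described above additionally pre-identifies network spillover effects and lets the effect vary over time.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)

# illustrative data: returns of one firm, the financial system, and a state variable
n = 1500
state = rng.normal(size=n)                                   # e.g., lagged volatility or spreads
r_firm = -0.3 * state + rng.standard_t(5, n)                 # firm return
r_sys = -0.2 * state + 0.4 * r_firm + rng.standard_t(5, n)   # system return
q = 0.05                                                     # tail probability for VaR

# Step 1: firm tail risk from a quantile regression of the firm return on the state variable
X1 = sm.add_constant(state)
fit1 = sm.QuantReg(r_firm, X1).fit(q=q)
var_firm = fit1.predict(X1)          # conditional q-quantile of the firm return

# Step 2: system quantile regression on the firm's estimated tail risk plus controls;
# the coefficient on var_firm acts as a (here time-invariant) systemic risk beta
X2 = sm.add_constant(np.column_stack([state, var_firm]))
fit2 = sm.QuantReg(r_sys, X2).fit(q=q)
print(f"systemic risk beta estimate: {fit2.params[-1]:.3f}")
```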

    Nonparametric Estimation of Risk-Neutral Densities

    This chapter deals with nonparametric estimation of the risk-neutral density. We present three different approaches which do not require parametric functional assumptions on the underlying asset price dynamics or on the distributional form of the risk-neutral density. The first estimator is a kernel smoother of the second derivative of call prices, while the second procedure applies kernel-type smoothing in the implied volatility domain. In the conceptually different third approach, we assume the existence of a stochastic discount factor (pricing kernel) which establishes the risk-neutral density conditional on the physical measure of the underlying asset. Via direct series-type estimation of the pricing kernel, we can derive an estimate of the risk-neutral density by solving a constrained optimization problem. The methods are compared using European call option prices. The focus of the presentation is on practical aspects, such as the appropriate choice of smoothing parameters, in order to facilitate the application of the techniques. Keywords: risk-neutral density, pricing kernel, kernel smoothing, local polynomials, series methods.
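
    The first of the three approaches rests on the Breeden-Litzenberger relation q(K) = e^{r tau} d^2C/dK^2, i.e. the risk-neutral density is the discounted second derivative of the call price in the strike. The sketch below obtains that derivative from a local cubic polynomial smoother and, as a sanity check, applies it to Black-Scholes prices, for which the recovered density should be approximately lognormal. Bandwidth, strike grid, and option parameters are chosen purely for illustration.

```python
import numpy as np
from scipy.stats import norm

def rnd_from_calls(strikes, calls, grid, r=0.02, tau=0.25, h=5.0):
    """Breeden-Litzenberger style estimator: q(K) = exp(r*tau) * d^2 C / dK^2,
    with the second derivative taken from a local cubic polynomial fit of call
    prices in the strike (Gaussian kernel weights, bandwidth h)."""
    dens = np.empty(len(grid))
    for i, k0 in enumerate(grid):
        u = strikes - k0
        sw = np.exp(-0.25 * (u / h) ** 2)            # square root of the kernel weights
        X = np.column_stack([np.ones_like(u), u, u ** 2, u ** 3])
        beta, *_ = np.linalg.lstsq(X * sw[:, None], calls * sw, rcond=None)
        dens[i] = np.exp(r * tau) * 2.0 * beta[2]    # C''(k0) = 2 * beta_2
    return dens

# illustrative check with Black-Scholes call prices (true RND is lognormal)
S0, sigma, r, tau = 100.0, 0.2, 0.02, 0.25
K = np.linspace(60.0, 140.0, 81)
d1 = (np.log(S0 / K) + (r + 0.5 * sigma ** 2) * tau) / (sigma * np.sqrt(tau))
C = S0 * norm.cdf(d1) - K * np.exp(-r * tau) * norm.cdf(d1 - sigma * np.sqrt(tau))
grid = np.linspace(75.0, 125.0, 51)
q_hat = rnd_from_calls(K, C, grid, r=r, tau=tau)
print(f"estimated density mass on the grid: {q_hat.sum() * (grid[1] - grid[0]):.3f}")
```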

    Measuring connectedness of euro area sovereign risk

    We introduce a method for measuring the default risk connectedness of euro area sovereign states using credit default swap (CDS) and bond data. The connectedness measure is based on an out-of-sample variance decomposition of model forecast errors. Due to its predictive nature, it can respond more quickly to crisis occurrences than common in-sample techniques. We determine sovereign default risk connectedness with both CDS and bond data to obtain a more comprehensive picture of the system. We find evidence that several observable factors drive the difference between CDS and bonds, but both data sources still contain specific information on connectedness spillovers. Generally, we can identify countries that impose risk on the system and the respective spillover channels. In our empirical analysis we cover the years 2009-2014, such that recovery paths of countries exiting EU and IMF financial assistance schemes and responses to the ECB's unconventional policy measures can be analyzed.
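
    For intuition, the sketch below computes a standard Diebold-Yilmaz-type connectedness table from the forecast error variance decomposition of a small VAR fitted to synthetic data. The measure described above differs in that it uses an out-of-sample decomposition and combines CDS with bond data, so this should be read only as the familiar in-sample baseline; the series, lag order, and horizon are assumptions for illustration.

```python
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(3)

# purely synthetic panel: spread changes for three sovereigns with built-in
# one-directional spillovers (A -> B -> C)
n, names = 600, ["A", "B", "C"]
eps = rng.normal(size=(n, 3))
y = np.zeros((n, 3))
for t in range(1, n):
    y[t, 0] = 0.3 * y[t - 1, 0] + eps[t, 0]
    y[t, 1] = 0.4 * y[t - 1, 0] + 0.2 * y[t - 1, 1] + eps[t, 1]
    y[t, 2] = 0.3 * y[t - 1, 1] + 0.2 * y[t - 1, 2] + eps[t, 2]

# fit a VAR and take the forecast error variance decomposition at horizon 10
res = VAR(y).fit(maxlags=2, ic="aic")
fevd = res.fevd(10).decomp[:, -1, :]    # rows = affected series, columns = shock source

# connectedness: share of each country's forecast error variance due to shocks
# from other countries, and share it transmits to the others
from_others = 1 - np.diag(fevd)
to_others = fevd.sum(axis=0) - np.diag(fevd)
for i, name in enumerate(names):
    print(f"{name}: from others {from_others[i]:.2f}, to others {to_others[i]:.2f}")
```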

    Measuring Connectedness of Euro Area Sovereign Risk

    We introduce a methodology for measuring default risk connectedness that is based on an out-of-sample variance decomposition of model forecast errors. The out-of-sample nature of the procedure leads to "realized" measures which, in practice, respond more quickly to crisis occurrences than those based on in-sample methods. The resulting relative and absolute connectedness measures extract distinct and complementary information from CDS and bond yield data on euro area sovereign risk. The detection and use of these second-moment differences between CDS and bond data is new to the literature and allows us to distinguish countries that impose risk on the system from those which sustain risk.

    Testing for an omitted multiplicative long-term component in GARCH models

    We consider the problem of testing for an omitted multiplicative long-term component in GARCH-type models. Under the alternative, there is a two-component model with a short-term GARCH component that fluctuates around a smoothly time-varying long-term component driven by the dynamics of an explanatory variable. We suggest a Lagrange Multiplier statistic for testing the null hypothesis that this variable has no explanatory power. We derive the asymptotic theory for our test statistic and investigate its finite-sample properties by Monte Carlo simulation. Our test also covers the mixed-frequency case in which the returns are observed at a higher frequency than the explanatory variable. The usefulness of our procedure is illustrated by empirical applications to S&P 500 return data.
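
    Schematically, the two-component structure under the alternative can be written as below; the notation is illustrative, loosely in the spirit of multiplicative GARCH-MIDAS-type decompositions, and need not match the paper's exact parameterization.

```latex
\begin{align}
  r_t &= \sqrt{h_t\, g_t}\;\varepsilon_t, \qquad
  g_t = \exp\!\bigl(\pi' x_{t-1}\bigr), \qquad
  \varepsilon_t \sim \mathrm{i.i.d.}(0,1), \\
  h_t &= \omega + \alpha\,\frac{r_{t-1}^2}{g_{t-1}} + \beta\, h_{t-1}, \\
  H_0 &:\; \pi = 0 \quad\Longleftrightarrow\quad g_t \equiv 1
        \quad \text{(the model collapses to a standard GARCH(1,1)).}
\end{align}
```

    In this form the Lagrange Multiplier principle only requires estimating the model under H_0, i.e. a plain GARCH(1,1), and testing whether the score with respect to pi, evaluated at those restricted estimates, differs significantly from zero; in the mixed-frequency case the low-frequency explanatory variable would typically be held fixed within each return period.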

    Nonparametric nonstationary regression

    This thesis studies nonparametric estimation techniques for a general regression set-up under very weak conditions on the covariate process. In particular, regressors are allowed to be high-dimensional, stochastically nonstationary processes. The concept of nonstationarity comprises time series observations of random walk or long memory type. Admissible processes are β-null Harris recurrent processes. We introduce the first kernel-type estimation method for such nonstationary regressors that does not restrict their dimension. This set-up is motivated by, and generalizes, approaches in parametric econometric time series analysis with nonstationary components. Additive regression allows us to circumvent the usual nonparametric curse of dimensionality and to countervail the additionally present nonstationary curse of dimensionality while retaining high modeling flexibility. We propose a backfitting-type estimation procedure for which it is sufficient that the response Y, all univariate components Xj, and all pairs of bivariate marginal components Xjk of the vector of covariates X are (potentially nonstationary) β-null Harris recurrent processes. The full-dimensional vector of regressors X itself, however, is not required to be Harris recurrent. This is particularly important since, e.g., random walks are Harris recurrent only up to dimension two. We also suggest tailored estimation methods for the case in which more than two covariates are jointly β-null Harris recurrent; these can provide the best additive fit, in a projection sense, to a more general non-additive class of true models. Under different types of independence assumptions, asymptotic distributions are derived for the general case of a (potentially nonstationary) β-null Harris recurrent residual, but also for the special case of a stationary mixing residual. The latter case deserves special attention, since the model might then be regarded as an additive type of cointegration model; in contrast to existing, more general approaches, the number of cointegrated regressors is not restricted. In general, no pre-testing for stationary or nonstationary covariates is needed; the same method applies to all. If, however, we have prior information that some covariates are stationary and know which ones, we present a tailored estimation procedure which attains faster rates of convergence, fully countervailing the nonstationary curse of dimensionality. In particular, we manage to estimate stationary components at stationary rates, which confirms the practical relevance of the procedure. Furthermore, finite-sample properties are discussed in a simulation study.
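
    Stripped of the nonstationarity theory, the backfitting idea is to iterate univariate kernel smooths of partial residuals, so that only low-dimensional marginal components of the covariate vector are ever smoothed. The sketch below uses two random-walk covariates, a Gaussian Nadaraya-Watson smoother, and hand-picked bandwidths; these choices are illustrative and do not reproduce the thesis' specific estimator or its asymptotic theory.

```python
import numpy as np

rng = np.random.default_rng(4)

# illustrative data: two random-walk (nonstationary) covariates entering an
# additive regression function; all choices here are for illustration only
n = 800
x1 = np.cumsum(rng.normal(size=n))
x2 = np.cumsum(rng.normal(size=n))
y = np.sin(0.5 * x1) + 0.1 * x2 + rng.normal(scale=0.3, size=n)

def nw_smooth(x_eval, x_obs, resid, h):
    """Gaussian-kernel Nadaraya-Watson smoother of partial residuals."""
    k = np.exp(-0.5 * ((x_eval[:, None] - x_obs[None, :]) / h) ** 2)
    return (k * resid).sum(axis=1) / k.sum(axis=1)

# classical backfitting: alternate univariate smooths of partial residuals;
# only marginal (one-dimensional) smoothing is ever performed
m1 = np.zeros(n)
m2 = np.zeros(n)
for _ in range(20):
    m1 = nw_smooth(x1, x1, y - y.mean() - m2, h=1.0)
    m1 -= m1.mean()                  # center the components for identification
    m2 = nw_smooth(x2, x2, y - y.mean() - m1, h=1.0)
    m2 -= m2.mean()

fitted = y.mean() + m1 + m2
print(f"in-sample R^2: {1 - np.var(y - fitted) / np.var(y):.2f}")
```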